Learning to integrate arbitrary signals from vision and touch.

Author

  • Marc O Ernst
Abstract

When different perceptual signals of the same physical property are integrated, for example, an object's size, which can be seen and felt, they form a more reliable sensory estimate (e.g., M. O. Ernst & M. S. Banks, 2002). This, however, implies that the sensory system already knows which signals belong together and how they relate. In other words, the system has to know the mapping between the signals. In a Bayesian model of cue integration, this prior knowledge can be made explicit. Here, we ask whether such a mapping between two arbitrary sensory signals from vision and touch can be learned from their statistical co-occurrence such that they become integrated. In the Bayesian framework, this means changing the belief about the distribution of the stimuli. To this end, we trained subjects with stimuli that are usually unrelated in the world: the luminance of an object (visual signal) and its stiffness (haptic signal). In the training phase, we presented subjects with artificially correlated combinations of these two signals and thus introduced a new mapping between them; for example, the stiffer the object, the brighter it was. We measured the influence of learning by comparing discrimination performance before and after training. The prediction is that integration makes discrimination worse for stimuli that are incongruent with the newly learned mapping, because integration would cause this incongruency to disappear perceptually. The more certain subjects are about the new mapping, the stronger the influence on discrimination performance should be. Thus, learning in this context is about acquiring beliefs. Comparing trials with congruent and incongruent stimuli, we found a significant change in discrimination performance between the pre- and post-training measurements. After training, discrimination thresholds for the incongruent stimuli were increased relative to thresholds for congruent stimuli, suggesting that subjects effectively learned to integrate the two formerly unrelated signals.
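For reference, the reliability-weighted (maximum-likelihood) combination rule cited above (Ernst & Banks, 2002) can be sketched with generic visual and haptic estimates $\hat{S}_V$, $\hat{S}_H$ and variances $\sigma_V^2$, $\sigma_H^2$; the symbols are illustrative and not values reported in this study:

$\hat{S}_{VH} = w_V \hat{S}_V + w_H \hat{S}_H$, with $w_i = \frac{1/\sigma_i^2}{1/\sigma_V^2 + 1/\sigma_H^2}$ and $\sigma_{VH}^2 = \frac{\sigma_V^2\,\sigma_H^2}{\sigma_V^2 + \sigma_H^2} \le \min(\sigma_V^2, \sigma_H^2)$.

Because the combined variance is never larger than the smaller single-cue variance, the fused estimate is more reliable; by the same token, fusing an incongruent signal pair averages the conflict away perceptually, which is the basis of the discrimination prediction in the abstract.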


Similar references

Learning to Combine Arbitrary Signals from Vision and Touch

When different perceptual signals of the same physical property are integrated–e.g., the size of an object, which can be seen and felt–they form a more reliable sensory estimate [3]. This however implies that the sensory system already knows which signals belong together and how they are related. In a Bayesian model of cue integration this prior knowledge can be made explicit. Here, we examine ...


Cross-Modal Correspondence Among Vision, Audition, and Touch in Natural Objects: An Investigation of the Perceptual Properties of Wood.

Certain systematic relationships are often assumed between information conveyed from multiple sensory modalities; for instance, a small figure and a high pitch may be perceived as more harmonious. This phenomenon, termed cross-modal correspondence, may result from correlations between multi-sensory signals learned in daily experience of the natural environment. If so, we would observe cross-mod...


Robot Motion Vision Part I: Theory

A direct method called fixation is introduced for solving the general motion vision problem, arbitrary motion relative to an arbitrary environment. This method results in a linear constraint equation which explicitly expresses the rotational velocity in terms of the translational velocity. The combination of this constraint equation with the Brightness-Change Constraint Equation solves the gene...


Learning from vision-to-touch is different than learning from touch-to-vision

We studied whether vision can teach touch to the same extent as touch seems to teach vision. In a 2 × 2 between-participants learning study, we artificially correlated visual gloss cues with haptic compliance cues. In two "natural" tasks, we tested whether visual gloss estimations have an influence on haptic estimations of softness and vice versa. In two "novel" tasks, in which participants wer...


Commentary: Integrating Associative Learning Signals Across the Brain

Associative learning is defined as the ability to link arbitrary stimuli or actions together in memory. The neural correlates of this fundamental form of plasticity were first described in the hippocampus during delay eye blink conditioning and have since been examined using a variety of tasks in both rats and monkeys. In monkeys, the neural correlates of associative learning have been studied ...



Journal:
  • Journal of vision

Volume 7, Issue 5

Pages: –

Published: 2007